
    Efficient epidemic multicast in heterogeneous networks

    The scalability and resilience of epidemic multicast, also called probabilistic or gossip-based multicast, rest on its symmetry: each participant node contributes the same share of bandwidth, spreading the load and allowing for redundancy. On the other hand, this symmetry means that gossiping does not avoid nodes or links with less capacity. Unfortunately, one cannot naively break such symmetry without also endangering scalability and resilience. In this paper we show how to escape this dilemma by lazily deferring message transmission according to a configurable policy. An experimental proof-of-concept illustrates the approach. Fundação para a Ciência e a Tecnologia (FCT) - Project “P-SON: Probabilistically Structured Overlay Networks” (POSC/EIA/60941/2004)
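    The lazy-deferral idea described in the abstract can be sketched as follows: a node eagerly gossips cheap message advertisements, but defers the payload transfer until a peer actually asks for it, so payloads already received via another path are never sent twice. This is only an illustrative sketch under assumed names (`Node`, `on_advert`, `on_payload`, `wants`), not the paper's protocol.

```python
# Sketch of lazy-push gossip: advertise message IDs eagerly (cheap),
# defer the payload (expensive) until the receiver requests it.
# All class and method names are illustrative assumptions.

class Node:
    def __init__(self):
        self.have = {}        # msg_id -> payload, messages fully received
        self.pending = set()  # msg_ids advertised to us but not yet fetched

    def on_advert(self, msg_id):
        # A peer advertised a message; remember it only if we lack the payload.
        if msg_id not in self.have:
            self.pending.add(msg_id)

    def on_payload(self, msg_id, payload):
        # Full message arrived (eagerly or after a request).
        self.have[msg_id] = payload
        self.pending.discard(msg_id)

    def wants(self):
        # IDs this node would request when its deferral policy fires.
        return set(self.pending)

a, b = Node(), Node()
a.on_payload("m1", "data")  # a holds the full message
b.on_advert("m1")           # b only hears the cheap advertisement
print(b.wants())            # b would later pull "m1" from a
b.on_payload("m1", a.have["m1"])
print(b.wants())            # nothing left to fetch
```

    The configurable policy in the paper then decides *when* deferred payloads are pulled; the sketch leaves that decision to the caller.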

    Large-Scale Newscast Computing on the Internet


    Newscast Computing


    Autonomous multi-dimensional slicing for large-scale distributed systems

    Slicing is a distributed systems primitive that autonomously partitions a large set of nodes based on node-local attributes. Slicing is decisive for automatically provisioning system resources to different services based on their requirements or importance. One of the main limitations of existing slicing protocols is that only single-dimension attributes are considered for partitioning. In practical settings, it is often necessary to find the best compromise across an ensemble of metrics. In this paper we propose an extension of the slicing primitive that allows multi-attribute slicing of distributed systems. Our protocol employs a gossip-based approach that requires no centralized knowledge and allows self-organization. It leverages the notion of domination between nodes, which forms a partial order between multi-dimensional points, in a way similar to skyline queries in databases. We evaluate and demonstrate the benefits of our approach using large-scale simulations. This work received support from the Portuguese Foundation for Science and Technology under grant SFRH/BD/71476/2010.
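    The domination relation the abstract refers to is the standard Pareto-dominance partial order used by skyline queries: one point dominates another when it is at least as good in every attribute and strictly better in at least one. A minimal sketch, assuming higher attribute values are better (the attribute tuples and function names are illustrative, not from the paper):

```python
# Pareto domination and skyline computation over node attribute tuples.

def dominates(a, b):
    """True if a dominates b: a >= b in every attribute and
    strictly > in at least one (higher is better)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def skyline(points):
    """Return the non-dominated points: no other point dominates them."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Example: (bandwidth, uptime) per node; (7, 1) is dominated by (10, 2).
nodes = [(10, 2), (8, 5), (3, 9), (7, 1)]
print(skyline(nodes))  # [(10, 2), (8, 5), (3, 9)]
```

    In the gossip protocol, each node evaluates this relation only against peers it hears about, so the partial order emerges without centralized knowledge.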

    Reliability and performance of UEGO, a clustering-based global optimizer

    The new customs regulation, the Union Customs Code (UCC), with the goal of electronic customs handling within the European Union, applies from 1 May 2016. The Union Customs Code means electronic customs handling both between business and the customs administration and between member states' customs administrations. The law aims to simplify disclosure and customs management for businesses, which requires an extensive upgrade of IT solutions for both business and customs administrations. The quality of the data transferred to Customs' tariff system (TDS) is assumed to have greater importance than today, as the UCC places a greater emphasis on traceability. Organisations should therefore optimize the quality of the data submitted to Customs with their declarations, in order to avoid future problems. Optimal data quality is achieved through efficient administrative processes. This paper has therefore, through a case study, examined factors that prevent efficient administrative work. The results show that the business system's substandard functionality, unreliable data quality, lack of internal communication and lack of standardised operating procedures negatively affect the efficiency of the administrative processes.

    UEGO, an abstract niching technique for global optimization


    Robust epidemic aggregation under churn

    In large-scale distributed systems, data aggregation is a fundamental task that provides a global synopsis over a distributed set of data values. Epidemic protocols are based on a randomised communication paradigm inspired by biological systems and have been proposed to provide decentralised, scalable and fault-tolerant solutions to the data aggregation problem. However, in epidemic aggregation, node failure and churn have a detrimental effect on the accuracy of the local estimates of the global aggregation target. In this paper, a novel approach, the Robust Epidemic Aggregation Protocol (REAP), is proposed to provide robustness in the presence of churn by detecting three distinct phases in the aggregation process. An analysis of the impact of each phase on estimation accuracy is provided. In particular, a novel mechanism is introduced to improve the phase that is most critical for the protocol's accuracy. REAP is validated by means of simulations and is shown to converge with a good level of accuracy for a reasonable range of node churn rates.
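    The baseline that REAP builds on is classic push-pull epidemic averaging: each round, every node pairs with a random peer and both adopt the mean of their two estimates, so all local estimates converge to the global average. A minimal sketch of that baseline (without REAP's churn-handling phases; function name and parameters are illustrative):

```python
import random

def gossip_average(values, rounds=50, seed=0):
    """Push-pull epidemic averaging.

    Each round, every node i picks a uniform random peer j and both
    replace their estimates with the pairwise mean. The sum of all
    estimates is invariant, so estimates converge to the global average.
    """
    rng = random.Random(seed)
    est = list(values)
    n = len(est)
    for _ in range(rounds):
        for i in range(n):
            j = rng.randrange(n)
            mean = (est[i] + est[j]) / 2
            est[i] = est[j] = mean
    return est

vals = [10.0, 0.0, 4.0, 6.0]
print(gossip_average(vals))  # every estimate close to the true average 5.0
```

    Under churn, a departing node removes its share of the conserved sum, which is exactly the failure mode the paper's three-phase detection addresses.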